Nonlinear Robust Regression Using Kernel Principal Component Analysis and R-Estimators

Authors

  • Antoni Wibowo
  • Mohammad Ishak Desa
Abstract

In recent years, many algorithms based on kernel principal component analysis (KPCA) have been proposed, including kernel principal component regression (KPCR). KPCR can be viewed as a nonlinear extension of principal component regression (PCR), which uses ordinary least squares (OLS) to estimate its regression coefficients. PCR is used to mitigate the negative effects of multicollinearity in regression models. However, the main disadvantage of OLS is well known to be its sensitivity to outliers, so KPCR can be inappropriate for data sets containing outliers. In this paper, we propose a novel nonlinear robust regression technique based on a hybridization of KPCA and R-estimators. The proposed technique is compared to KPCR and gives better results.
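The KPCR pipeline the abstract describes (extract kernel principal components, then regress on them with OLS) can be sketched as follows. This is an illustrative sketch, not the authors' implementation; the function name, kernel choice, and parameters are assumptions.

```python
# Sketch of kernel principal component regression (KPCR): project inputs
# onto leading kernel principal components, then fit OLS on the projections.
# Names and parameters are illustrative, not taken from the paper.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.linear_model import LinearRegression

def kpcr_fit_predict(X_train, y_train, X_test, n_components=10, gamma=0.5):
    # Nonlinear feature extraction: RBF-kernel PCA keeps the leading
    # principal components in feature space (nonlinear analogue of PCR).
    kpca = KernelPCA(n_components=n_components, kernel="rbf", gamma=gamma)
    Z_train = kpca.fit_transform(X_train)
    Z_test = kpca.transform(X_test)
    # OLS on the extracted components; this least-squares step is what
    # makes plain KPCR sensitive to outliers.
    ols = LinearRegression().fit(Z_train, y_train)
    return ols.predict(Z_test)

# Toy nonlinear regression problem
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(200)
pred = kpcr_fit_predict(X[:150], y[:150], X[150:])
print(pred.shape)  # (50,)
```

The proposed robust variant would replace the OLS step with an R-estimator (a rank-based regression estimator), leaving the KPCA feature-extraction stage unchanged.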


Similar Resources

Robust Principal Component Analysis Using Statistical Estimators

Principal Component Analysis (PCA) finds a linear mapping that maximizes the variance of the data, which makes PCA sensitive to outliers and may cause wrong eigendirections. In this paper, we propose techniques to solve this problem: we use a data-centering method and re-estimate the covariance matrix using robust statistical techniques such as the median and robust scaling, which is a booster to data-cente...
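The data-centering idea mentioned in this abstract can be illustrated with a minimal sketch: center the data at the coordinate-wise median (a robust location estimate) instead of the mean before computing the eigendecomposition. This shows only the robust-centering step, not the authors' full covariance re-estimation, and all names are assumptions.

```python
# Sketch of median-based data centering for PCA: a few gross outliers pull
# the median far less than the mean, so the estimated center (and hence the
# eigendirections) are less distorted. Illustrative only, not the paper's
# exact method.
import numpy as np

def median_centered_pca(X, n_components=2):
    center = np.median(X, axis=0)        # robust location estimate
    Xc = X - center
    cov = Xc.T @ Xc / (len(X) - 1)       # covariance of the centered data
    vals, vecs = np.linalg.eigh(cov)     # eigenvalues in ascending order
    order = np.argsort(vals)[::-1][:n_components]
    return vecs[:, order], center

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 3))
X[:5] += 50.0                            # inject a few gross outliers
components, center = median_centered_pca(X)
print(components.shape)  # (3, 2)
```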


Kernel Principal Component Ranking: Robust Ranking on Noisy Data

We propose the kernel principal component ranking algorithm (KPCRank) for learning preference relations. The algorithm can be considered an extension of nonlinear principal component regression applicable to the preference learning task. It is particularly suitable for learning from noisy datasets where a lower-dimensional data representation preserves the most expressive features. In many cases nea...


Kernel ridge vs. principal component regression: minimax bounds and adaptability of regularization operators

Regularization is an essential element of virtually all kernel methods for nonparametric regression problems. A critical factor in the effectiveness of a given kernel method is the type of regularization that is employed. This article compares and contrasts members from a general class of regularization techniques, which notably includes ridge regression and principal component reg...


An application of principal component analysis and logistic regression to facilitate production scheduling decision support system: an automotive industry case

Production planning and control (PPC) systems have to deal with rising complexity and dynamics. The complexity of planning tasks is due to multiple existing variables and dynamic factors derived from the uncertainties surrounding PPC. Although the literature on exact scheduling algorithms, simulation approaches, and heuristic methods is extensive in production planning, they seem to be ineff...


Kernel PCA for Feature Extraction and De-Noising in Nonlinear Regression

In this paper, we propose the application of the Kernel Principal Component Analysis (PCA) technique for feature selection in a high-dimensional feature space, where input variables are mapped by a Gaussian kernel. The extracted features are employed in the regression problems of chaotic Mackey–Glass time-series prediction in a noisy environment and estimating huma...



Journal:

Volume   Issue

Pages  -

Publication date: 2011